Dynamic message-passing equations for models with unidirectional dynamics
Understanding and quantifying the dynamics of disordered out-of-equilibrium
models is an important problem in many branches of science. Using the dynamic
cavity method on time trajectories, we construct a general procedure for
deriving the dynamic message-passing equations for a large class of models with
unidirectional dynamics, which includes the zero-temperature random field Ising
model, the susceptible-infected-recovered model, and rumor spreading models. We
show that unidirectionality of the dynamics is the key ingredient that makes
the problem solvable. These equations are applicable to single instances of the
corresponding problems with arbitrary initial conditions, and are
asymptotically exact for problems defined on locally tree-like graphs. When
applied to real-world networks, they generically provide a good analytic
approximation of the real dynamics.

Comment: Final version
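The susceptible-infected-recovered (SIR) model mentioned above has a standard discrete-time formulation on a graph. As a minimal reference point (this is a direct stochastic simulation against which DMP predictions could be compared, not the DMP equations themselves; the unit recovery time and per-edge transmission probability are illustrative assumptions):

```python
import random

def simulate_sir(adj, p_trans, seeds, t_max, rng=random.Random(0)):
    """Discrete-time SIR on a graph given as an adjacency dict.
    States: 'S', 'I', 'R'. Each infected node attempts once to infect
    each susceptible neighbor, then recovers (unit recovery time)."""
    state = {v: 'S' for v in adj}
    for s in seeds:
        state[s] = 'I'
    for _ in range(t_max):
        newly_infected = set()
        for v, st in state.items():
            if st != 'I':
                continue
            for u in adj[v]:
                if state[u] == 'S' and rng.random() < p_trans:
                    newly_infected.add(u)
            state[v] = 'R'
        for u in newly_infected:
            state[u] = 'I'
        if not newly_infected:
            break
    return state

# A small tree -- locally tree-like graphs are where DMP is asymptotically exact.
tree = {0: [1, 2], 1: [0, 3, 4], 2: [0], 3: [1], 4: [1]}
final = simulate_sir(tree, p_trans=1.0, seeds=[0], t_max=10)
```

With deterministic transmission (`p_trans=1.0`) the epidemic started at node 0 sweeps the whole tree, so every node ends in the recovered state.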
Online Learning of Power Transmission Dynamics
We consider the problem of reconstructing the dynamic state matrix of
transmission power grids from time-stamped PMU measurements in the regime of
ambient fluctuations. Using a maximum likelihood based approach, we construct a
family of convex estimators that adapt to the structure of the problem
depending on the available prior information. The proposed method is fully
data-driven and does not assume any knowledge of system parameters. It can be
implemented in near real-time and requires a small amount of data. Our learning
algorithms can be used for model validation and calibration, and can also be
applied to related problems of system stability, detection of forced
oscillations, generation re-dispatch, as well as to the estimation of the
system state.

Comment: 7 pages, 4 figures
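The core reconstruction task above can be illustrated with a much simpler estimator than the paper's structured convex family: if the ambient dynamics are linearized as dx = A x dt + noise, ordinary least squares on time-stamped samples already recovers the state matrix. A minimal sketch, assuming a hypothetical 2x2 stable matrix in place of a real grid model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stable dynamic state matrix (stand-in for the linearized
# grid dynamics; a real system's matrix would be unknown).
A_true = np.array([[-1.0, 0.5],
                   [0.3, -0.8]])

# Simulate ambient fluctuations: Euler-Maruyama steps of dx = A x dt + dW.
dt, n_steps = 1e-3, 200_000
x = np.zeros(2)
X, dX = [], []
for _ in range(n_steps):
    noise = rng.normal(scale=np.sqrt(dt), size=2)
    x_next = x + A_true @ x * dt + noise
    X.append(x)
    dX.append(x_next - x)
    x = x_next
X, dX = np.array(X), np.array(dX)

# Ordinary least squares: solve dX ~ (dt * X) A^T for A.
A_hat = np.linalg.lstsq(X * dt, dX, rcond=None)[0].T
```

The paper's estimators refine this baseline by exploiting prior structural information and convexity; the plain least-squares version here needs many more samples than a near-real-time method could afford.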
Learning Energy-Based Representations of Quantum Many-Body States
Efficient representation of quantum many-body states on classical computers
is a problem of enormous practical interest. An ideal representation of a
quantum state combines a succinct characterization informed by the system's
structure and symmetries, along with the ability to predict the physical
observables of interest. A number of machine learning approaches have been
recently used to construct such classical representations [1-6] which enable
predictions of observables [7] and account for physical symmetries [8].
However, the structure of a quantum state gets typically lost unless a
specialized ansatz is employed based on prior knowledge of the system [9-12].
Moreover, most such approaches give no information about what states are easier
to learn in comparison to others. Here, we propose a new generative
energy-based representation of quantum many-body states derived from Gibbs
distributions used for modeling the thermal states of classical spin systems.
Given prior information about a family of quantum states, the energy
function can be specified by a small number of parameters using an explicit
low-degree polynomial or a generic parametric family such as neural nets, and
can naturally include the known symmetries of the system. Our results show that
such a representation can be efficiently learned from data using exact
algorithms in a form that enables the prediction of expectation values of
physical observables. Importantly, the structure of the learned energy function
provides a natural explanation for the hardness of learning for a given class
of quantum states.
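The energy-based representation described above can be illustrated on a classical toy system: a low-degree polynomial energy function defines a Gibbs distribution over spin configurations, from which expectation values of observables follow directly. A minimal sketch using brute-force enumeration for three spins (the couplings and fields are hypothetical illustrative values, not parameters learned from quantum measurement data):

```python
import itertools
import math

# Illustrative pairwise couplings and fields for a 3-spin chain; in the
# paper's setting these parameters would be learned from data.
J = {(0, 1): 1.0, (1, 2): 1.0}   # degree-2 terms of the polynomial energy
h = [0.2, 0.0, -0.2]             # degree-1 (field) terms

def energy(s):
    """Low-degree polynomial energy E(s) = -sum J_ij s_i s_j - sum h_i s_i."""
    return (-sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
            - sum(hi * si for hi, si in zip(h, s)))

# Gibbs distribution p(s) proportional to exp(-E(s)) over all 2^3 configurations.
configs = list(itertools.product([-1, 1], repeat=3))
weights = [math.exp(-energy(s)) for s in configs]
Z = sum(weights)

def expectation(obs):
    """Predict <obs> under the Gibbs distribution."""
    return sum(w * obs(s) for w, s in zip(weights, configs)) / Z

corr_01 = expectation(lambda s: s[0] * s[1])  # nearest-neighbor correlation
```

For larger systems the exhaustive sum is replaced by the exact learning and inference algorithms the abstract refers to; the point of the sketch is only the structure: energy function in, observable predictions out.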